This Southwest Virginia boy is fed up with the world and America. Daily I hear and see things that blow me away, both in general conversation and on television and radio. It seems to me this country has gotten to the point where we are so afraid of offending somebody that we don't stand for anything. Growing up, there were a few things that were ingrained in me, and I hold them to be very valuable, a general tool for my life.

My parents taught me to be respectful. Over the last week I saw an openly gay man (and don't go nuts; if a straight person had done this I'd think it was wrong as well) act as if he were performing sexual intercourse on another male while trying to pick up a package at a post office. How and why is this acceptable? I fully believe that if my grandparents had seen this kind of behavior they would have commented or forced it to stop. Today we do nothing, and I'm guilty of it too; the people at the post office laughed as if it were some big joke.

African Americans, blacks, or whatever the correct term is that I'm supposed to use today are an interesting group, in my opinion. Some of them find it acceptable to call each other the N-word; I hear it at least four or five times a day. I personally find the word offensive and racist, so why do they use it toward each other? While I'm on this subject, I'll go ahead and address another issue: last week I went to vote and heard numerous jokes about Obama, along with that infamous word, from white people. I was equally offended, and just as with blacks/African Americans, I lose respect for them every time I hear it.

I also wonder what exactly is going on in our understanding of human nature. Men and women are different; in the basic sense, men are stronger and women are more nurturing. Yet we try to interchange women and men as if to make them one and the same. Going from my own personal experience: as a child, my father worked out of town and for a period of time I only saw him on weekends, so my mother took care of me. To me this was the logical and correct choice; she taught me basic rights and wrongs, what to do and what not to do, and so on. As I got older, my father taught me how to be a man, how to treat women, the birds and the bees, and the rest. Yet of my good friends, only a small fraction have both parents; some never knew their fathers, and some had a mother who brought home the money while the father kept house.

Saturday at a football game, a college student in the stands started screaming (so loudly I could hear it on the sideline) that the official had a "boner." For those of you who don't know (though how you wouldn't in today's world is beyond me), that is a slang term for a male erection. For a female to engage in that kind of language is horrific in my opinion, and to me it would be terrible even if a man were screaming it at a public event where there are children and elderly people around. Yet again, nobody in the crowd did anything; we just carried on as if nothing was happening.

I was reading on this board about homosexuality, and it got me thinking about writing this. To me, if we as a society want to start allowing homosexuals to marry, adopt kids, and be "open," as we are moving toward, it's just another step in the wrong direction. Many kids don't have fathers now; don't compound the problem by giving them two mothers, or one man and a "metrosexual." Has America lost its common sense and values?
I personally couldn't give a damn if you're white, black, red, yellow, brown, or whatever; what I do care about is that we as a country get back some of our respect and honor. It's not a black thing, it's not a white thing, it's an American thing. Men need to be men, women need to be women, and we need to teach our youth how to act in a responsible manner; if not, those of us who actually care will end up feeling even worse than we do now.